188 research outputs found

    Introducing commencing students to “being a scientist” – A review of a new compulsory academic literacies course

    The Bachelor of Science program at The University of Adelaide was reviewed by an external expert panel in 2007. One of the key recommendations to come from this review was the requirement to introduce a common compulsory course in the first year of study focussing on the development of academic literacies. This new course (‘Principles & Practice of Science I’) was offered for the first time in Semester 1, 2011. Students were introduced to the idea of what it means to “be a scientist” through discussion of the broad array of scientific endeavour, the integrated nature of scientific disciplines, and the importance of scientific process and critical thinking. Academic writing was used as the key vehicle for assessment of the desired learning outcomes. This presentation will focus on some of the tools and activities used in ‘Principles & Practice of Science I’ to develop students’ academic writing skills.

    More than just lab reports – Introducing students to academic literacies

    The ALTC Learning & Teaching Academic Standards for Science (Jones & Yates, 2011) have been widely endorsed (including by the Australian Council of Deans of Science) after extensive consultation, and have the potential to serve as markers for Learning Standards in a more highly regulated environment under TEQSA. The Threshold Learning Outcomes do, however, present a number of challenges on the ‘Understanding Science’ and ‘Communication’ fronts (TLOs 1 & 4), as these outcomes are often not explicitly supported in current curricula. The Bachelor of Science program at The University of Adelaide was reviewed by an external expert panel in 2007. One of the key recommendations to come from this review was the requirement to introduce a common compulsory course in the first year of study focussing on the development of academic literacies. This new course (‘Principles & Practice of Science I’) was offered for the first time in Semester 1, 2011; TLOs 1 & 4 of the Science LTAS were key drivers in its development. Students were introduced to the idea of what it means to “be a scientist” through discussion of the broad array of scientific endeavour, the integrated nature of scientific disciplines, and the importance of scientific process and critical thinking. Academic writing was used as the key vehicle for assessment of the desired learning outcomes. In this presentation, the curriculum construct and outcomes from the first two cohorts of students will be reviewed.
    Reference: Jones, S., & Yates, B. (2011). Science Learning and Teaching Academic Standards Statement [PDF]. Retrieved from http://www.olt.gov.au/resource-learning-and-teaching-academic-standards-science-201

    Knowing what they know is half the battle: Investigating student conceptions of stoichiometry

    Stoichiometry is a core topic in high school chemistry, yet it is often one of the most daunting for students, as the extensive range of chemical and mathematical concepts required for understanding makes this topic complex and challenging for student learning (Wink & Ryan, 2019; Ramful & Narod, 2014). The long-standing interest in research into student learning has afforded conceptions, misconceptions and alternative conceptions as vehicles through which researchers can investigate student understanding (Taber, 2015; Gilbert & Watts, 1983). Two-tier diagnostic instruments have been used extensively in previous research for this purpose, as they are useful in obtaining data on students’ conceptions of the relevant subject matter (Treagust, 1988; Soeharto et al., 2019). This study involved the development of a two-tier diagnostic instrument to investigate the interplay between stoichiometry and mathematics. There were four steps to the development and refinement of the instrument: defining the scope of the project, constructing the first tier, obtaining alternative conceptions, and developing the second tier. With multiple refinements during the design phase, the development of the instrument was informed by face validation from ten ‘critical friends’, alongside written responses from both undergraduate and graduate students (n = 24) and think-aloud interviews (n = 12). The final version of the two-tier diagnostic instrument contains 21 items, comprising 14 chemistry items and 7 mathematics items. Development of a two-tier diagnostic instrument that addresses two interrelated subject matter areas is a novel application of the two-tier diagnostic approach, and the implications of this will be presented.
    REFERENCES
    Gilbert, J. K., & Watts, D. M. (1983). Concepts, misconceptions and alternative conceptions: Changing perspectives in science education. Studies in Science Education, 10(1), 61-98.
    Ramful, A., & Narod, F. B. (2014). Proportional reasoning in the learning of chemistry: Levels of complexity. Mathematics Education Research Journal, 26(1), 25-46.
    Soeharto, S., Csapó, B., Sarimanah, E., Dewi, F., & Sabri, T. (2019). A review of students’ common misconceptions in science and their diagnostic assessment tools. Jurnal Pendidikan IPA Indonesia, 8(2), 247-266.
    Taber, K. S. (2015). Alternative conceptions/frameworks/misconceptions. In Encyclopedia of Science Education (pp. 37-41).
    Treagust, D. F. (1988). Development and use of diagnostic tests to evaluate students’ misconceptions in science. International Journal of Science Education, 10(2), 159-169.
    Wink, D. J., & Ryan, S. A. C. (2019). The logic of proportional reasoning and its transfer into chemistry. In It’s Just Math: Research on Students’ Understanding of Chemistry and Mathematics (pp. 157-171).
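    In a two-tier item of the kind described above (Treagust, 1988), the first tier asks the content question and the second asks for the reasoning behind the chosen answer, so a response is credited as sound understanding only when both tiers are correct. The following is a minimal sketch of that scoring logic; the item identifier and answer keys are hypothetical illustrations, not items from the instrument described in this study.

```python
from dataclasses import dataclass

@dataclass
class TwoTierItem:
    """One two-tier diagnostic item: an answer tier plus a reasoning tier."""
    item_id: str
    answer_key: str  # correct option for tier 1 (the content answer)
    reason_key: str  # correct option for tier 2 (the justification)

def classify_response(item: TwoTierItem, answer: str, reason: str) -> str:
    """Classify a student's response pattern on a two-tier item.

    Only a correct answer *with* a correct justification is scored as
    sound understanding; other patterns flag possible alternative
    conceptions or guessing.
    """
    if answer == item.answer_key and reason == item.reason_key:
        return "sound understanding"
    if answer == item.answer_key:
        return "correct answer, flawed reasoning (possible guess or alternative conception)"
    if reason == item.reason_key:
        return "incorrect answer, correct reasoning (possible slip)"
    return "alternative conception candidate"

# Hypothetical stoichiometry item, for illustration only.
item = TwoTierItem("stoich-01", answer_key="B", reason_key="C")
print(classify_response(item, answer="B", reason="A"))
```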

    Assessing standards in science

    Internationally there is recognition that critical thinking, analytical reasoning, problem solving and communication are critical for citizenship in the 21st century, and assessment of “quality” and “standards” is increasingly on the agenda (e.g. AHELO, QAA, Tuning project). In 2011, the Australian government legislated for a quality and standards framework (through the formation of TEQSA, the Tertiary Education Quality & Standards Agency), which will regulate tertiary education against an agreed set of standards developed by the Higher Education Standards Panel. Prior to this, in 2010, the Organisation for Economic Co-operation and Development (OECD) launched a new initiative, the Assessment of Higher Education Learning Outcomes (AHELO), to measure learning outcomes of graduates across the OECD’s thirty-one member countries. At the same time, the Australian Government, through the Australian Learning and Teaching Council, funded a project to define the threshold learning outcomes in science as part of the development of the Learning and Teaching Academic Standards project. The threshold learning outcomes for science describe a series of minimum standards expected of a science graduate. ACER (Australian Council for Educational Research) has been engaged by the OECD to develop approaches to assess both generic and discipline-specific learning outcomes in Economics and Engineering for the AHELO project (http://www.acer.edu.au/research/he/assessment-of-higher-education-learning-outcomes-ahelo/); similarly, in the US the Collegiate Learning Assessment (CLA, http://www.collegiatelearningassessment.org/index.html) will be used to assess the critical thinking skills of graduates and to benchmark across institutions and disciplines.

    Factors underpinning student perceptions of laboratory experiences

    Background
    Survey data gathered as part of the Advancing Science by Enhancing Learning in the Laboratory (ASELL) project and its predecessors have been used previously to draw correlations between student perceptions of different aspects of laboratory-based activities and their perceived overall learning experience (Barrie et al., 2015). However, typical past analyses have applied scoring techniques to ordered categorical response data, conflating student-dependent and student-independent contributions to student responses. Rasch modeling techniques provide an opportunity to control for the biases of individual students, revealing more sample-independent correlations in student perceptions that can be used to inform teaching practice. In particular, the Linear Logistic Test Model (Fischer, 1995) can express sample-independent measures for each survey item as a linear combination of more basic factors of the experience.
    Aims
    The aim of this research was to derive a Linear Logistic Test Model for the ASELL Student Learning Experience (ASLE) survey, expressing “overall learning experience” as a linear combination of more basic factors of the learning experience.
    Methods
    A data set of 128,881 individual data points, provided by over 9000 students in response to the ASLE survey and gathered from 29 practical activities run from 2011 to 2015, was input into a Rasch model, extracting student-independent measures of quality for each experiment. These student-independent measures were subjected to factor analysis, and the results were subsequently converted into a Linear Logistic Test Model of the ASLE survey data. The number of factors extracted was determined by balancing the parsimony of the model against the proportion of observed data variance explained, using the corrected Akaike Information Criterion (Burnham & Anderson, 2004).
    Results
    The final Linear Logistic Test Model reveals six major identifiable contributors to the laboratory learning experience. In descending order of impact on responses, these factors are the perceived connection to lecture theory, the quality of instructional material provided, understanding of theory through collaboration with others, the development of data interpretation skills, independent learning, and the reliance on or appreciation for the demonstrator. A large component of “overall learning experience” appears to be due to aspects not addressed by ASLE survey items. The model yields equations for facets of the laboratory learning experience targeted by the ASLE survey, such as the equation for “overall learning experience” below (Equation 1).

$$
\delta_{14\ (\text{overall learning experience})} =
\begin{bmatrix} -2 \\ 2 \\ 0 \\ 1 \\ 1 \\ 2 \\ 5 \end{bmatrix}
\cdot
\begin{bmatrix}
\eta_{\text{theory focus}} \\
\eta_{\text{instructions}} \\
\eta_{\text{collaborative understanding}} \\
\eta_{\text{data interpretation}} \\
\eta_{\text{independent learning}} \\
\eta_{\text{demonstrators}} \\
\eta_{\text{unexplained overall}}
\end{bmatrix}
\qquad (1)
$$

    Similar equations are also obtained for other items of the survey, revealing models for fostering aspects of the experience such as student interest, increased understanding and development of technical skills.
    Conclusions
    The equations comprising the Linear Logistic Test Model have a range of pedagogical implications for the structure of laboratory learning activities. Notably, increased understanding appears to be irrelevant to perceived “overall learning experience”, raising questions as to the consequential validity of using student response data to drive the design of learning activities. A general theme of conflict between student preferences and attainment of learning objectives is recognized.
    References
    Barrie, S. C., Bucat, R. B., Buntine, M. A., Burke da Silva, K., Crisp, G. T., George, A. V., Jamie, I. M., Kable, S. H., Lim, K. F., Pyke, S. M., Read, J. R., Sharma, M. D., & Yeung, A. (2015). Development, evaluation and use of a student experience survey in undergraduate science laboratories: The Advancing Science by Enhancing Learning in the Laboratory Student Laboratory Learning Experience Survey. International Journal of Science Education, 37(11), 1795-1814.
    Burnham, K. P., & Anderson, D. R. (2004). Multimodel inference: Understanding AIC and BIC in model selection. Sociological Methods & Research, 33(2), 261-304.
    Fischer, G. H. (1995). The linear logistic test model. In Rasch models (pp. 131-155). Springer.
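    To make Equation 1 above concrete, here is a minimal sketch of how an LLTM decomposition is evaluated: the item’s Rasch difficulty is the dot product of the basic-parameter weights with the estimated eta values, and the corrected AIC guides how many factors to retain. The weight vector is taken from Equation 1, but the eta values are hypothetical placeholders, not results reported in this study.

```python
import numpy as np

# Basic-parameter weights for ASLE item 14 ("overall learning experience"),
# taken from Equation 1 above.
factor_names = [
    "theory focus", "instructions", "collaborative understanding",
    "data interpretation", "independent learning", "demonstrators",
    "unexplained overall",
]
weights = np.array([-2, 2, 0, 1, 1, 2, 5])

# Hypothetical eta estimates (logits) -- placeholders, not study results.
eta = np.array([0.30, -0.10, 0.25, 0.05, 0.15, -0.20, 0.40])

# LLTM: the item difficulty is a linear combination of basic parameters.
delta_14 = weights @ eta
for name, contribution in zip(factor_names, weights * eta):
    print(f"{name:>28s}: {contribution:+.2f}")
print(f"delta_14 = {delta_14:+.2f}")

def aicc(log_likelihood: float, k: int, n: int) -> float:
    """Corrected Akaike Information Criterion (Burnham & Anderson, 2004),
    used to balance model parsimony against explained variance when
    choosing the number of factors to extract."""
    aic = 2 * k - 2 * log_likelihood
    return aic + (2 * k * (k + 1)) / (n - k - 1)
```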

    Advancing science by enhancing learning in the laboratory (ASELL)

    Final report of the Advancing Science by Enhancing Learning in the Laboratory (ASELL) project. Most researchers agree that the laboratory experience ranks as a significant factor that influences students’ attitudes to their science courses. Consequently, good laboratory programs should play a major role in influencing student learning and performance. The laboratory program can be pivotal in defining a student's experience in the sciences, and if done poorly, can be a major contributing factor in causing disengagement from the subject area. The challenge remains to provide students with laboratory activities that are relevant, engaging and offer effective learning opportunities. The Advancing Science by Enhancing Learning in the Laboratory (ASELL) project has developed over the last 10 years with the aim of improving the quality of learning in undergraduate laboratories, providing a validated means of evaluating and improving the laboratory experience of students, and providing effective professional development for academic staff. After successful development in chemistry and trials using the developed principles in physics and biology, the project, with ALTC funding, has now expanded to include those disciplines. The launching pad for ASELL was a multidisciplinary workshop held in Adelaide in April 2010. This workshop involved 100 academics and students, plus 13 Deans of Science (or delegates), covering the three enabling sciences of biology, chemistry and physics. Thirty-nine undergraduate experiments were trialled over the three days of the workshop. More importantly, the workshop provided professional development in laboratory education for the 42 academic staff who attended. Following the workshop, delegates continued to evaluate, develop and improve both individual experiments and whole laboratory programs in their home institutions, mentored by the ASELL team. Some highlights include:
    - more than 15,000 student surveys carried out by delegates during 2010/11
    - 10 whole laboratory programs surveyed by delegates
    - 4 new ASELL-style workshops, conducted by ASELL-trained delegates, run in 2010/11
    - more than 100 ASELL-tested experiments available on the website (www.asell.org)
    - ASELL workshops conducted in the Philippines and Ireland in 2010, with workshops planned in the USA and Thailand for 2011
    - significant improvement in student evaluation of whole laboratory programs and individual experiments measured in universities using the ASELL approach
    - a high profile for ASELL activities in the Australian Council of Deans of Science (ACDS)
    - a completed research project on the misconceptions of academic staff about laboratory learning
    - significant research on student learning in the laboratory, and on staff perceptions of student learning, carried out during 2010/11
    - research results benchmarked against staff and students in the USA.
    The biggest unresolved issue for ASELL is one of sustainability in the post-ALTC funding era. ASELL will make a series of recommendations to the ACDS, but the future of the program depends, to a large part, on how the ACDS responds.

    Comparing student cohorts between years in first-year chemistry assessments

    BACKGROUND
    It is not uncommon to hear the sentiment that student cohorts are less capable than the cohorts that came before them. However, the evidence for this sentiment is severely lacking, as it is challenging to compare student cohorts from different years due to changes in how the different cohorts were assessed and changes to the overall structure and delivery of the courses across years. The University of Adelaide has a large bank of multiple-choice (MCQ) assessment results from assessments undertaken in its four first-year chemistry courses that can be used to address this issue, as most of the MCQ items used within the assessments have not changed over time.
    AIMS
    By using the results of MCQ assessments over multiple years, this research aims to determine if there has been a significant change in the average ability of student cohorts from different years.
    DESIGN AND METHODS
    Using a combination of item stacking and Rasch analysis, it is possible to compare the difficulty of the MCQ items and the ability of the students over multiple years by placing them on the same relative scale. This allows the item difficulties to be compared, to ensure that the items perform the same way each year they are used. If the item difficulties are not significantly different, then any changes in assessment results must be due to changes in the ability of the student cohort(s), and the ability of the students can then be compared through statistical analysis to determine if there are any significant changes over multiple years.
    RESULTS
    By comparing student results over a period of four years (2012-2015) in four different first-year chemistry courses, it was observed that none of the MCQ items used over multiple years showed significant differences in their difficulty, so any changes in assessment results must be a result of changes in the ability of the student cohort(s). In two of the four courses, the 2012 cohort showed significant differences from the other yearly cohorts. However, one of these cohorts showed significantly higher student ability, while the other showed significantly lower student ability. No other instances of a significant ability difference between the cohorts were observed in this analysis.
    CONCLUSIONS
    Only two of the sixteen cohorts analysed showed a significant difference between their average ability and the average ability of the cohorts from other years participating in the same course. In addition, the two cohorts that did show significant differences (both from 2012) did not follow the same trend as each other, meaning it is unreasonable to infer any potential trends from these results alone. At this point, the results of this analysis suggest that the ability of the students has not significantly changed between years. Further work using MCQ assessment results from additional cohorts will be carried out to investigate whether the 2012 cohort results are in fact anomalous.
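    The following is a minimal sketch of the comparison logic described above: after stacking responses and estimating item difficulties and person abilities on a common Rasch scale, stable item difficulties license a direct test for cohort ability differences. All numbers below (the invariance threshold, the simulated ability distributions) are hypothetical placeholders, not data from this study.

```python
import numpy as np
from scipy import stats

def rasch_p(theta: float, b: float) -> float:
    """Dichotomous Rasch model: probability that a student of ability
    theta answers an item of difficulty b correctly (both in logits)."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

print(f"P(correct | theta=0.15, b=0.42) = {rasch_p(0.15, 0.42):.2f}")

# Hypothetical difficulties for one item estimated from stacked data,
# one value per year the item was used (placeholders).
b_2012, b_2013 = 0.42, 0.45
if abs(b_2012 - b_2013) < 0.3:  # illustrative invariance threshold
    print("Item difficulty stable across years; ability comparison is fair.")

# Hypothetical person-ability estimates (logits) for two yearly cohorts.
rng = np.random.default_rng(0)
abilities_2012 = rng.normal(0.15, 1.0, 500)
abilities_2013 = rng.normal(0.00, 1.0, 500)

# Welch's t-test for a difference in mean cohort ability.
t, p = stats.ttest_ind(abilities_2012, abilities_2013, equal_var=False)
print(f"t = {t:.2f}, p = {p:.3f}")
```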

    ASELL: The Advancing Science by Enhancing Learning in the Laboratory project

    Most science educators and researchers will agree that the laboratory experience ranks as a major factor that influences students’ attitudes to their science courses. Consequently, good laboratory programs should play a major role in influencing student learning and performance. The laboratory program can be pivotal in defining a student's experience in the sciences, and if done poorly, can be a major contributing factor in causing disengagement from the subject area. The challenge remains to provide students with laboratory activities that are relevant, engaging and offer effective learning opportunities.

    Why are we still teaching the way we were taught in the 1980s?

    The article discusses a project funded by the Australian Learning and Teaching Council that is designed to establish excellence in science learning and teaching in Australian universities. Six universities across the country were chosen for the initial implementation of the project. According to the article, over the duration of the project, leaders will undergo leadership training to equip them with the skills necessary to foster change in classroom practices.